Autoregressive model order selection by a finite sample estimator for the Kullback-Leibler discrepancy

Authors

  • Piet M. T. Broersen
  • H. Einar Wensink
Abstract

In other words, when (u; u(−N); …; u(−1)) belongs to U, asymptotically periodic inputs produce asymptotically periodic outputs with the same period. The proof of this theorem makes use of a contraction-mapping fixed-point argument. The techniques used in our omitted proofs are also useful in connection with related problems that are "more nonlinear." In particular, related results are given in [9] for the discrete-time "quadratic filter" whose output y(0), y(1), … satisfies y(n) =


Similar resources

Order selection for vector autoregressive models

Order-selection criteria for vector autoregressive (AR) modeling are discussed. The performance of an order-selection criterion is optimal if the model of the selected order is the most accurate model in the considered set of estimated models: here vector AR models. Suboptimal performance can be a result of underfit or overfit. The Akaike information criterion (AIC) is an asymptotically unbiase...
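As a minimal sketch of the scalar case (not the vector AR method the snippet discusses), the following illustrates AIC-based order selection: fit AR(p) models of increasing order by ordinary least squares and pick the order minimizing AIC = N·ln(σ̂²) + 2p. The function names and the simulated AR(2) process are illustrative, not from the paper.

```python
import numpy as np

def fit_ar_ls(x, p):
    """Least-squares fit of an AR(p) model; returns the residual variance."""
    N = len(x)
    # Regression matrix of lagged samples: column k holds lag k+1.
    X = np.column_stack([x[p - k - 1:N - k - 1] for k in range(p)])
    y = x[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    res = y - X @ a
    return np.mean(res ** 2)

def select_order_aic(x, max_order):
    """Return the AR order minimizing AIC = N*ln(sigma2) + 2p."""
    N = len(x)
    aic = [N * np.log(fit_ar_ls(x, p)) + 2 * p
           for p in range(1, max_order + 1)]
    return int(np.argmin(aic)) + 1

# Simulate an AR(2) process: x[n] = 0.75 x[n-1] - 0.5 x[n-2] + e[n].
rng = np.random.default_rng(0)
e = rng.standard_normal(2000)
x = np.zeros(2000)
for n in range(2, 2000):
    x[n] = 0.75 * x[n - 1] - 0.5 * x[n - 2] + e[n]

order = select_order_aic(x, 10)
print(order)
```

With a long record like this, AIC typically recovers an order at or slightly above the true order 2; underfit and overfit correspond to selecting below or above it.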


Estimation of Kullback–Leibler divergence by local likelihood

Motivated from the bandwidth selection problem in local likelihood density estimation and from the problem of assessing a final model chosen by a certain model selection procedure, we consider estimation of the Kullback–Leibler divergence. It is known that the best bandwidth choice for the local likelihood density estimator depends on the distance between the true density and the ‘vehicle’ para...


Using Kullback-Leibler distance for performance evaluation of search designs

This paper considers the search problem, introduced by Srivastava [Sr]. This is a model discrimination problem. In the context of search linear models, the discrimination ability of search designs has been studied by several researchers. Some criteria have been developed to measure this capability; however, they are restricted in the sense of being able to work for searching only one possibl...


Bias of the corrected AIC criterion for underfitted regression and time series models

The Akaike Information Criterion, AIC (Akaike, 1973), and a bias-corrected version, AICc (Sugiura, 1978; Hurvich & Tsai, 1989), are two methods for selection of regression and autoregressive models. Both criteria may be viewed as estimators of the expected Kullback-Leibler information. The bias of AIC and AICc is studied in the underfitting case, where none of the candidate models includes the t...
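The relation between the two criteria can be sketched with the widely used small-sample correction AICc = AIC + 2k(k+1)/(n − k − 1), where k is the number of estimated parameters and n the sample size; this illustrates the general form, not the paper's bias analysis, and the numeric values below are illustrative.

```python
import numpy as np

def aic(n, k, sigma2):
    """AIC for a Gaussian model: n*ln(sigma2) + 2k."""
    return n * np.log(sigma2) + 2 * k

def aicc(n, k, sigma2):
    """Bias-corrected AIC (Sugiura 1978; Hurvich & Tsai 1989):
    AICc = AIC + 2k(k+1)/(n - k - 1)."""
    return aic(n, k, sigma2) + 2 * k * (k + 1) / (n - k - 1)

# With a short series (n = 30) the correction term grows quickly in k,
# so AICc penalizes high orders much more heavily than AIC does.
n, sigma2 = 30, 1.0
for k in (2, 5, 10):
    print(k, aic(n, k, sigma2), aicc(n, k, sigma2))
```

As n → ∞ with k fixed, the correction term vanishes and AICc reduces to AIC, which is why the distinction matters mainly for short records.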


Tracking Interval for Type II Hybrid Censoring Scheme

The purpose of this paper is to obtain the tracking interval for the difference of expected Kullback-Leibler risks of two models under a Type II hybrid censoring scheme. This interval helps us to evaluate proposed models in comparison with each other. We derive a statistic which tracks the difference of expected Kullback-Leibler risks between maximum likelihood estimators of the distribution in two diff...



Journal title:
  • IEEE Trans. Signal Processing

Volume 46  Issue 

Pages  -

Publication date 1998